-
Wildfires pose an escalating risk to communities and infrastructure, especially in regions undergoing increased fuel dryness and temperature extremes driven by climate change, as well as continued expansion into the wildland-urban interface (WUI). Probabilistic wildfire risk assessment provides a rigorous means of quantifying potential impacts, but its application is often hindered by the high computational cost of working with hundreds of thousands of complex wildfire scenarios. This study introduces a novel scenario reduction framework tailored to the unique characteristics of wildfire hazards, which often lack standard intensity metrics and exhibit highly nonlinear, spatially distributed behavior. The proposed framework selects a subset of scenarios that best represent the spatial and statistical diversity of the full dataset, thereby greatly reducing computational costs while accounting for uncertainties. This is achieved by mapping complex wildfire scenarios into a high-dimensional feature space, enabling similarity assessments based on spatial consequence patterns rather than standard intensity metrics. A k-medoids clustering approach is then used to identify a representative subset of scenarios, while an active-learning-based outlier selection procedure incorporates rare but high-impact events without inflating computational demands. The framework was first demonstrated using a simple illustrative example to show how its performance responds to different data characteristics. To further demonstrate the practicality of the framework, it was used for wildfire risk assessment in Spokane County, Washington, where the full dataset (1000 scenarios) was reduced to 41 representative scenarios while preserving the spatial patterns of burn probability and building damage with high fidelity. The results demonstrated that the framework significantly improves computational efficiency and accuracy compared to traditional scenario reduction methods, offering a scalable and flexible tool for probabilistic wildfire risk assessment.
Free, publicly-accessible full text available January 12, 2027
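For intuition, the clustering step described above can be sketched as a plain k-medoids routine over per-scenario feature vectors. This is a minimal illustration, not the authors' implementation: the feature construction, distance metric, and active-learning outlier step are specific to the paper, and the array shapes and the SciPy-based distance computation below are assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist, squareform

def k_medoids(features, k, n_iter=100, seed=0):
    """Pick k representative rows (medoids) of an (n_scenarios, n_features) array."""
    rng = np.random.default_rng(seed)
    n = features.shape[0]
    dist = squareform(pdist(features))             # pairwise Euclidean distances
    medoids = rng.choice(n, size=k, replace=False)
    for _ in range(n_iter):
        labels = dist[:, medoids].argmin(axis=1)   # assign each scenario to nearest medoid
        new_medoids = medoids.copy()
        for j in range(k):
            members = np.where(labels == j)[0]
            if members.size == 0:
                continue
            # the medoid minimises total distance to the other members of its cluster
            within = dist[np.ix_(members, members)].sum(axis=1)
            new_medoids[j] = members[within.argmin()]
        if np.array_equal(new_medoids, medoids):
            break
        medoids = new_medoids
    return medoids, labels

# Placeholder stand-ins for flattened per-scenario consequence maps
# (e.g., burn-probability or damage grids); 1000 scenarios reduced to 41,
# with scenario weights taken from cluster sizes.
scenarios = np.random.default_rng(1).random((1000, 256))
medoids, labels = k_medoids(scenarios, k=41)
weights = np.bincount(labels, minlength=41) / len(scenarios)
```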
-
Navigating dilemmas involving conflicting values is challenging even for humans in high-stakes domains, let alone for AI, yet prior work has been limited to everyday scenarios. To close this gap, we introduce CLASH (Character perspective-based LLM Assessments in Situations with High-stakes), a meticulously curated dataset consisting of 345 high-impact dilemmas along with 3,795 individual perspectives of diverse values. CLASH enables the study of critical yet underexplored aspects of value-based decision-making processes, including understanding of decision ambivalence and psychological discomfort as well as capturing the temporal shifts of values in the perspectives of characters. By benchmarking 14 non-thinking and thinking models, we uncover several key findings. (1) Even strong proprietary models, such as GPT-5 and Claude-4-Sonnet, struggle with ambivalent decisions, achieving only 24.06 and 51.01 accuracy, respectively. (2) Although LLMs reasonably predict psychological discomfort, they do not adequately comprehend perspectives involving value shifts. (3) Cognitive behaviors that are effective in the math-solving and game-strategy domains do not transfer to value reasoning. Instead, new failure patterns emerge, including early commitment and overcommitment. (4) The steerability of LLMs towards a given value is significantly correlated with their value preferences. (5) Finally, LLMs exhibit greater steerability when reasoning from a third-party perspective, although certain values (e.g., safety) benefit uniquely from first-person framing.
Free, publicly-accessible full text available January 1, 2027
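A minimal sketch of how items like these might be represented and scored is given below; the dataset's actual schema, label set, and prompting protocol are not specified in the abstract, so every field name and label here is hypothetical.

```python
from dataclasses import dataclass

# Hypothetical record layout for a CLASH-style item; the real dataset's
# fields and decision labels are not given in the abstract.
@dataclass
class Perspective:
    character: str        # whose viewpoint the dilemma is judged from
    values: list[str]     # values held by the character (may shift over time)
    gold_decision: str    # e.g. "act", "refrain", or "ambivalent"

@dataclass
class Dilemma:
    situation: str
    perspectives: list[Perspective]

def decision_accuracy(predictions, dilemmas, only_ambivalent=False):
    """Exact-match accuracy of model decisions, optionally restricted to the
    ambivalent gold labels on which the abstract reports the lowest scores."""
    hits = total = 0
    for d in dilemmas:
        for p in d.perspectives:
            if only_ambivalent and p.gold_decision != "ambivalent":
                continue
            total += 1
            hits += predictions.get((d.situation, p.character)) == p.gold_decision
    return hits / total if total else 0.0
```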
-
During plate convergence, shallow subduction or underthrusting of the lower-plate lithosphere beneath an overriding plate often results in far-field intraplate deformation, as observed in the Late Cretaceous–Paleogene North American Laramide or Cenozoic Himalayan-Tibetan orogens. Perplexingly, during this shallow-slab process, wide expanses of crust between the plate boundary and the intraplate orogen do not experience significant synchronous deformation. These apparently undeformed crustal regions may reflect (1) a strong, rigid plate, (2) increased gravitational potential energy (GPE) that resists shortening and uplift, or (3) decoupling of the upper-plate lithosphere from any basal tractions. Here we review the geology of three orogens that formed due to flat-slab subduction or underthrusting: the Himalayan-Tibetan, Mesozoic southeast China, and Laramide orogens. These orogens all involved intraplate deformation >1000 km from the plate boundary, large regions of negligible crustal shortening between the plate-boundary and intraplate thrust belts, hot crustal conditions within the hinterland regions, and extensive upper-plate porphyry copper mineralization. A hot and weak hinterland is inconsistent with its persistence as an undeformed rigid block. GPE analysis suggests that hinterland quiescence is not uniquely due to thickened crust and elevated GPE, as exemplified by shallow marine sedimentation with low surface elevations in SE China. Comparison of these intracontinental orogens allows us to advance a general model in which hot orogenic hinterlands with a weak, mobile lower crust allow decoupling from basal tractions exerted by flat-slab or underthrusting events. This hypothesis suggests that basal tractions locally drive intraplate orogens, at least partially controlled by the strength of the upper-plate lithosphere.
Free, publicly-accessible full text available January 1, 2027
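For context, the GPE referred to above is typically evaluated per unit area of a lithospheric column; one common formulation (not necessarily the exact one used by the authors) is $$\mathrm{GPE}=\int_{-h}^{L}\sigma_{zz}(z)\,dz=\int_{-h}^{L}\Big(\int_{-h}^{z}\rho(z')\,g\,dz'\Big)\,dz,$$ where z is depth, h is surface elevation, L is the compensation depth, ρ is density, g is gravity, and σ_zz is the vertical lithostatic stress. Lateral contrasts in GPE between columns drive deviatoric stresses, so a thickened, elevated column carries excess GPE that resists further shortening, which is the sense in which GPE is invoked above.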
-
The transient behavior of rate-dependent adhesion in poro-viscoelastic contact is more complex than crack propagation in Mode I opening due to time-dependent material behavior, crack acceleration from nonlinear kinematics, and variation in contact radius. This study revisits our previous experiment, where a spherical glass probe is unloaded on flat gelatin, and investigates crack velocity ($V_\text{c}$) and energy release rate (ERR). For a given unloading rate, $V_\text{c}$ increases monotonically by one order of magnitude, and the wide range of unloading rates ensures that $V_\text{c}$ spans 3–4 orders of magnitude. At slow rates, ERR remains almost unchanged at 2–3 times the thermodynamic work of adhesion. At fast rates, ERR initially increases to 4–8 times the work of adhesion, then decreases until full separation. We hypothesize that the decreasing ERR trend is due to finite-size effects: the hysteretic energy dissipation zone grows with crack acceleration, while the material volume decreases during peeling. To explain these trends and the finite-size effect, we adapt de Gennes' viscoelastic crack propagation model, modifying it to account for crack acceleration and the reduction in contact radius. Under the given time scales (peeling time and viscoelastic relaxation time) and length scales (crack tip radius and initial contact radius), we simulate the evolution of ERR as peeling proceeds and compare the results with experimental data. The model's results show good qualitative agreement with the experiments. Finally, we discuss the model's limitations, assumptions, and directions for future research.
Free, publicly-accessible full text available December 1, 2026
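For reference, in sphere-on-flat unloading experiments the ERR is commonly extracted from the measured load P and contact radius a via a JKR-type relation; whether the authors use exactly this expression is not stated in the abstract, so it is given here only as the standard form: $$G=\frac{\left(K a^{3}/R-P\right)^{2}}{6\pi K a^{3}},$$ where R is the probe radius and $K=\tfrac{4}{3}E^{*}$ is the effective contact modulus. The rate dependence then enters through how G varies with the measured peeling velocity $V_\text{c}=-\mathrm{d}a/\mathrm{d}t$.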
